# Autoregressive Architecture

| Model | Maintainer | Description | Tags | Downloads | Likes |
| --- | --- | --- | --- | --- | --- |
| Codellama 34b Python Hf | meta-llama | A 34-billion-parameter, Python-specialized variant of Code Llama, Meta's family of pre-trained generative text models for code generation and understanding. | Large Language Model, Transformers, Other | 414 | 9 |
| Codellama 34b Hf | meta-llama | The 34-billion-parameter base model of Code Llama, Meta's family of code generation and understanding models ranging from 7 billion to 34 billion parameters. | Large Language Model, Transformers, Other | 492 | 15 |
| Codellama 13b Python Hf | codellama | A 13-billion-parameter variant of Code Llama, Meta's family of large models for code generation and understanding, specialized for Python. | Large Language Model, Transformers, Other | 2,230 | 51 |
| Codellama 13b Hf | codellama | A 13-billion-parameter generative code model designed for general-purpose code synthesis and understanding. | Large Language Model, Transformers, Other | 6,829 | 107 |
| Llama 7b Hf Transformers 4.29 | elinas | A 7-billion-parameter version of LLaMA, Meta AI's efficient Transformer-based foundational language model, supporting multiple language processing tasks. | Large Language Model, Transformers, Other | 4,660 | 57 |
| Llama 7b Hf | yahma | A 7-billion-parameter version of LLaMA, Meta AI's open and efficient foundational language model, supporting 20 languages and aimed at natural language processing research. | Large Language Model, Transformers, Other | 22.30k | 85 |
| Llama 7b Embeddings | shalomma | An open, efficient 7-billion-parameter foundational language model developed by Meta AI, based on the Transformer architecture, with multilingual capability and a primary focus on English. | Large Language Model, Transformers, Other | 36 | 28 |
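Beyond plain left-to-right generation, the Code Llama base models (7B and 13B, but not the 34B or Python-specialized variants) were also trained for fill-in-the-middle infilling using `<PRE>`/`<SUF>`/`<MID>` sentinel tokens. As a minimal sketch, the prefix-suffix-middle prompt these models expect can be assembled like this (the helper name is illustrative, not part of any library):

```python
def build_infill_prompt(prefix: str, suffix: str) -> str:
    """Assemble a prefix-suffix-middle (PSM) infilling prompt.

    The sentinel layout follows the format commonly used with the
    Code Llama base checkpoints (e.g. codellama/CodeLlama-13b-hf);
    the model is expected to generate the missing middle after <MID>.
    """
    return f"<PRE> {prefix} <SUF>{suffix} <MID>"

# Ask the model to fill in the body of a function:
prompt = build_infill_prompt(
    "def add(a, b):\n    return ",
    "\n\nprint(add(1, 2))",
)
print(prompt)
```

The resulting string would then be tokenized and passed to the model's `generate` method; for the Python-specialized and 34B checkpoints above, only standard left-to-right prompting applies.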
© 2025 AIbase